Release Notes for Q2 2023
Explore the new features added in this update!
Updated in: August 2023
Release Version: 0.14
Feature | Description |
---|---|
Update of source and target nodes in an existing Databricks job for data integration, data transformation, or data quality | In Data Pipeline Studio, if the source or target node of a job changes, the node displays a Configuration Changed icon. You can edit the job with the required configuration. After you publish the pipeline with the changed configuration, the icon disappears. If required, you can also run the pipeline with the existing configuration. This enhancement eliminates the need to delete an existing job and create a new one. |
Custom parameters are now supported in Databricks custom transformation jobs | Data Pipeline Studio now supports adding custom parameters to a Databricks custom transformation job. You can select the required system parameters and add user-defined parameters manually or through a .json file. You can add the parameters directly from the Lazsa Platform without having to log on to the Databricks cluster. |
Logging is now enabled for a Databricks cluster configuration | During Databricks cluster configuration, you can now enable logging for Databricks by specifying the log path and log level. The log path can be either DBFS or an Amazon S3 bucket, and you can select the required log level from the available options. |
Audit log improvements in the Lazsa Platform | Audit logs are improved across the various capabilities of the Lazsa Platform. The improvements include logging support for additional objects, filtering by the appropriate object types, and improved readability of the event and summary fields. |
SSL file upload now supported on Amazon S3 | SSL certificates for a REST API source were earlier uploaded to Databricks File System (DBFS) by default. Now you can also upload new or existing SSL certificates to Amazon S3. |
Pipeline run failures handled with the resume pipeline feature | Data Pipeline Studio now provides an option to resume a pipeline from its point of failure. With this option, a failed pipeline does not need to run again from the beginning; instead, it resumes from the failed nodes only, thus optimizing resource utilization. |
Downloadable reports available for Teams dashboard widgets | The data in some widgets of the Teams dashboard can now be downloaded as a report. You can click View User details on a bar graph of the Resources by Allocation % and Resources by Number of Allocated Teams widgets and then download the report from the respective pages. |
Mandatory custom attributes with empty fields are flagged when you edit and save a product | If you edit a product and a mandatory custom attribute is empty (does not have a value), the tab is flagged (displayed in red) and the field shows a validation error. This helps you quickly identify and fix the problem before saving the product details. |
More technologies integrated with the Lazsa Platform | The latest versions of several software technologies are now supported in the Lazsa Platform. |
Support for SonarQube 9.x | The Lazsa Platform now supports SonarQube 9.9 LTS for code quality scans in the CI/CD pipeline. |
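For the custom parameters feature in Databricks custom transformation jobs, parameters can be supplied through a .json file. The exact schema expected by the Lazsa Platform is not shown in these notes; the following is a hypothetical sketch of such a file, with illustrative key and parameter names:

```json
{
  "systemParameters": ["run_id", "job_start_time"],
  "userParameters": {
    "input_path": "s3://example-bucket/raw/",
    "batch_size": "500",
    "environment": "dev"
  }
}
```

All names and values above are placeholders; refer to the platform documentation for the actual file format.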
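The DBFS and S3 log destinations mentioned for the Databricks cluster logging feature correspond to the `cluster_log_conf` setting in the Databricks Clusters API, which accepts either a `dbfs` or an `s3` destination. A minimal sketch of the underlying Databricks setting (bucket name, paths, and region are illustrative):

```json
{
  "cluster_log_conf": {
    "s3": {
      "destination": "s3://example-bucket/cluster-logs",
      "region": "us-east-1"
    }
  }
}
```

For a DBFS destination, the equivalent fragment is `{"cluster_log_conf": {"dbfs": {"destination": "dbfs:/cluster-logs"}}}`. How the Lazsa Platform maps its log path and log level fields onto this setting is an assumption here, not something stated in these notes.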